Anderson Acceleration for First-Order Methods in Visual Computing
Juyong Zhang
28-Dec-2020, 07:45-08:30
Abstract: Alternating optimization methods such as local-global solvers and the alternating direction method of multipliers (ADMM) are widely used in signal processing, machine learning, and computer graphics. These methods converge quickly to an approximate solution, but can take a long time to reach a high-accuracy solution. In this talk, I will present our work on applying Anderson acceleration to speed up the convergence of these methods by treating them as fixed-point iterations. We also analyze the convergence of the proposed acceleration method on nonconvex problems, and verify its effectiveness on a variety of problems.
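As background for the abstract, the idea of treating an alternating solver as a fixed-point iteration x ← g(x) and applying Anderson acceleration can be sketched as follows. This is a minimal, generic implementation of classical (type-II) Anderson acceleration, not the speaker's code; the function names, the history depth `m`, and the use of `np.cos` as a demonstration map are all illustrative assumptions.

```python
import numpy as np

def anderson_accelerate(g, x0, m=5, max_iter=100, tol=1e-10):
    """Anderson acceleration of the fixed-point iteration x <- g(x).

    g  : the fixed-point map (e.g., one sweep of an alternating solver)
    x0 : initial guess (1-D array)
    m  : number of previous iterates mixed into each update
    """
    x = np.asarray(x0, dtype=float)
    gx = g(x)
    # Histories of iterates x_k and residuals f_k = g(x_k) - x_k
    X, F = [x], [gx - x]
    x = gx
    for _ in range(max_iter):
        gx = g(x)
        f = gx - x
        if np.linalg.norm(f) < tol:
            return gx
        X.append(x)
        F.append(f)
        # Keep at most m+1 past iterates/residuals
        X, F = X[-(m + 1):], F[-(m + 1):]
        mk = len(F) - 1
        if mk > 0:
            # Differences of residuals and of images g(x_i) = x_i + f_i
            dF = np.column_stack([F[i + 1] - F[i] for i in range(mk)])
            dG = np.column_stack([(X[i + 1] + F[i + 1]) - (X[i] + F[i])
                                  for i in range(mk)])
            # Least-squares mixing coefficients: min_gamma ||f - dF @ gamma||
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            # Accelerated update: g(x_k) minus the extrapolated correction
            x = (x + f) - dG @ gamma
        else:
            x = gx
    return x
```

For example, applied to the contractive map g(x) = cos(x), the accelerated iteration reaches the fixed point near 0.739085 in far fewer evaluations than plain iteration x ← cos(x) would need for the same tolerance.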
Mathematics
Audience: researchers in the topic
| Organizers: | Shing Tung Yau, Shiu-Yuen Cheng, Sen Hu*, Mu-Tao Wang |
| *contact for this listing |
